In information theory, the conditional entropy (or equivocation) quantifies the amount of information needed to describe the outcome of a random variable ''Y'' given that the value of another random variable ''X'' is known. Here, information is measured in shannons, nats, or hartleys. The entropy of ''Y'' conditioned on ''X'' is written as <math>H(Y\mid X)</math>.

== Definition ==

If <math>H(Y\mid X=x)</math> is the entropy of the variable ''Y'' conditioned on the variable ''X'' taking a certain value ''x'', then <math>H(Y\mid X)</math> is the result of averaging <math>H(Y\mid X=x)</math> over all possible values ''x'' that ''X'' may take. Given discrete random variables ''X'' with domain <math>\mathcal X</math> and ''Y'' with domain <math>\mathcal Y</math>, the conditional entropy of ''Y'' given ''X'' is defined as:

: <math>H(Y\mid X) = \sum_{x\in\mathcal X} p(x)\,H(Y\mid X=x) = -\sum_{x\in\mathcal X,\, y\in\mathcal Y} p(x,y)\log\frac{p(x,y)}{p(x)}.</math>

''Note:'' It is understood that the expressions 0 log 0 and 0 log (''c''/0) for fixed ''c'' > 0 should be treated as being equal to zero.

<math>H(Y\mid X) = 0</math> if and only if the value of ''Y'' is completely determined by the value of ''X''. Conversely, <math>H(Y\mid X) = H(Y)</math> if and only if ''Y'' and ''X'' are independent random variables, since independence gives <math>p(y\mid x) = p(y)</math>.
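As a concrete illustration of the definition, the following is a minimal Python sketch (the function name <code>conditional_entropy</code> and the example distributions are illustrative, not part of the original article). It measures information in shannons (base-2 logarithm) and skips zero-probability terms, in line with the 0 log 0 convention noted above.

<syntaxhighlight lang="python">
import math

def conditional_entropy(joint):
    """H(Y|X) in shannons (bits) for a joint distribution {(x, y): p(x, y)}."""
    # Marginal p(x), obtained by summing the joint distribution over y.
    px = {}
    for (x, _y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    # H(Y|X) = -sum over (x, y) of p(x, y) * log2(p(x, y) / p(x)).
    # Zero-probability terms are skipped, matching the 0 log 0 convention.
    h = 0.0
    for (x, _y), p in joint.items():
        if p > 0.0:
            h -= p * math.log2(p / px[x])
    return h

# Y fully determined by X (here Y = X): H(Y|X) = 0.
print(conditional_entropy({(0, 0): 0.5, (1, 1): 0.5}))  # 0.0

# X and Y independent fair coins: H(Y|X) = H(Y) = 1 bit.
print(conditional_entropy({(x, y): 0.25 for x in (0, 1) for y in (0, 1)}))  # 1.0
</syntaxhighlight>

The two examples exercise the boundary cases stated above: the conditional entropy vanishes when ''Y'' is a function of ''X'', and equals <math>H(Y)</math> when the variables are independent.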